A comparison of the computational performance of Iteratively Reweighted Least Squares and alternating minimization algorithms for ℓ1 inverse problems

Authors

  • Paul Rodriguez
  • Brendt Wohlberg
Abstract

Alternating minimization algorithms with a shrinkage step, derived within the Split Bregman (SB) or Alternating Direction Method of Multipliers (ADMM) frameworks, have become very popular for ℓ1-regularized problems, including Total Variation and Basis Pursuit Denoising. It appears to be generally assumed that they deliver much better computational performance than older methods such as Iteratively Reweighted Least Squares (IRLS). We show, however, that IRLS type methods are computationally competitive with SB/ADMM methods for a variety of problems, and in some cases outperform them.
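As an illustrative aside (not taken from the paper), the sketch below shows the two kinds of update being compared, assuming the Basis Pursuit Denoising form min_x 0.5*||Ax - b||_2^2 + lam*||x||_1: an IRLS iteration that repeatedly solves a reweighted linear system, and the soft-thresholding (shrinkage) operator that appears inside Split Bregman / ADMM iterations. The function names irls_l1 and shrink, and all parameter values, are hypothetical.

# Minimal sketch, not the authors' code: the two update rules being compared.
import numpy as np

def irls_l1(A, b, lam, n_iter=50, eps=1e-6):
    """IRLS for 0.5*||Ax - b||^2 + lam*||x||_1: replace the l1 term by a
    reweighted quadratic and re-solve a linear system at each iteration."""
    n = A.shape[1]
    x = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(n_iter):
        w = 1.0 / (np.abs(x) + eps)                        # weights from the current iterate
        x = np.linalg.solve(AtA + lam * np.diag(w), Atb)   # weighted least-squares step
    return x

def shrink(v, t):
    """Soft-thresholding (shrinkage) step used inside SB/ADMM iterations."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

A quick usage check of the hypothetical solver: generate A = np.random.randn(40, 100), a sparse x0, set b = A @ x0, and call irls_l1(A, b, lam=1e-3); the recovered vector should be close to x0 for sufficiently sparse x0.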


Similar resources

Performance evaluation of typical approximation algorithms for nonconvex ℓp-minimization in diffuse optical tomography.

The sparse estimation methods that utilize the ℓp-norm, with p being between 0 and 1, have shown better utility in providing optimal solutions to the inverse problem in diffuse optical tomography. These ℓp-norm-based regularizations make the optimization function nonconvex, and algorithms that implement ℓp-norm minimization utilize approximations to the original ℓp-norm function. In this work, ...
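As context for the ℓp discussion (0 < p < 1), a common smoothing device, sketched below under our own assumptions rather than taken from this paper, replaces |x|^p by (x^2 + eps)^(p/2); its tangent quadratic gives the reweighting used by IRLS-style solvers. The helper names lp_smooth and lp_irls_weights are hypothetical.

import numpy as np

def lp_smooth(x, p=0.5, eps=1e-8):
    # Smoothed surrogate for sum_i |x_i|^p with 0 < p < 1 (illustrative form only).
    return np.sum((x**2 + eps) ** (p / 2))

def lp_irls_weights(x, p=0.5, eps=1e-8):
    # Tangent (majorizing) quadratic of the surrogate at the current x:
    # sum_i w_i * x_i^2 + const, with w_i = (p/2) * (x_i^2 + eps)**(p/2 - 1).
    return (p / 2) * (x**2 + eps) ** (p / 2 - 1)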


Piecewise Differentiable Minimization for Ill-posed Inverse Problems

Based on minimizing a piecewise differentiable lp function subject to a single inequality constraint, this paper discusses algorithms for a discretized regularization problem for ill-posed inverse problems. We examine computational challenges of solving this regularization problem. Possible minimization algorithms such as the steepest descent method, iteratively weighted least squares (IRLS) me...


A comparison of typical ℓp minimization algorithms

Recently, compressed sensing has been widely applied to various areas such as signal processing, machine learning, and pattern recognition. To find the sparse representation of a vector w.r.t. a dictionary, an l1 minimization problem, which is convex, is usually solved in order to overcome the computational difficulty. However, to guarantee that the l1 minimizer is close to the sparsest solutio...
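For concreteness, a minimal iterative shrinkage-thresholding (ISTA) sketch for the dictionary sparse-coding problem min_x 0.5*||Dx - y||^2 + lam*||x||_1 is given below; it is an illustrative l1 solver, not one of the algorithms benchmarked in the paper, and the name ista and its parameters are hypothetical.

import numpy as np

def ista(D, y, lam, n_iter=200):
    # Illustrative l1 solver: gradient step on the quadratic term, then shrinkage.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ x - y)              # gradient of 0.5*||Dx - y||^2
        v = x - g / L                      # gradient step
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # shrinkage step
    return x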


A Projected Alternating Least square Approach for Computation of Nonnegative Matrix Factorization

Nonnegative matrix factorization (NMF) is a common method in data mining that has been used in different applications as a dimension reduction, classification, or clustering method. Methods in the alternating least squares (ALS) approach are usually used to solve this non-convex minimization problem. At each step of ALS algorithms two convex least square problems should be solved, which causes high com...
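A minimal projected-ALS sketch is shown below, assuming the standard formulation V ≈ W H with W, H >= 0; it alternates unconstrained least-squares solves with projection of negative entries to zero. The function nmf_projected_als and its defaults are illustrative, not the method proposed in the paper.

import numpy as np

def nmf_projected_als(V, r, n_iter=100, seed=0):
    # Alternate least-squares solves for H and W, projecting each onto the
    # nonnegative orthant after the solve.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        H = np.linalg.lstsq(W, V, rcond=None)[0]        # min_H ||V - W H||_F
        H = np.maximum(H, 0.0)                          # project onto H >= 0
        W = np.linalg.lstsq(H.T, V.T, rcond=None)[0].T  # min_W ||V - W H||_F
        W = np.maximum(W, 0.0)                          # project onto W >= 0
    return W, H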




Journal title:

Volume   Issue

Pages  -

Publication date: 2012